prosthetic hand
Vision-Guided Grasp Planning for Prosthetic Hands in Unstructured Environments
Sulaiman, Shifa, Bachhar, Akash, Shen, Ming, Bøgh, Simon
Recent advancements in prosthetic technology have increasingly focused on enhancing dexterity and autonomy through intelligent control systems. Vision-based approaches offer promising results for enabling prosthetic hands to interact more naturally with diverse objects in dynamic environments. Building on this foundation, the paper presents a vision-guided grasping algorithm for a prosthetic hand, integrating perception, planning, and control for dexterous manipulation. A camera mounted on the setup captures the scene, and a Bounding Volume Hierarchy (BVH)-based vision algorithm segments the target object and defines its bounding box. Grasp contact points are then computed by generating candidate trajectories with the Rapidly-exploring Random Tree Star (RRT*) algorithm and selecting fingertip end poses based on the minimum Euclidean distance between these trajectories and the object's point cloud. Each finger's grasp pose is determined independently, enabling adaptive, object-specific configurations. A Damped Least Squares (DLS)-based inverse kinematics solver computes the corresponding joint angles, which are then transmitted to the finger actuators for execution. This modular pipeline enables per-finger grasp planning and supports real-time adaptability in unstructured environments. The proposed method is validated in simulation and through experimental integration on a Linker Hand O7 platform.
- Information Technology > Artificial Intelligence > Vision (1.00)
- Information Technology > Artificial Intelligence > Robots > Manipulation (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning (1.00)
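The abstract names a Damped Least Squares (DLS) inverse kinematics solver for per-finger joint angles. As an illustrative sketch only (not the paper's implementation): the DLS update is dq = J^T (J J^T + lambda^2 I)^-1 e, where J is the finger Jacobian, e the fingertip pose error, and lambda a damping factor. All function names and the two-link planar finger used below are assumptions for illustration.

```python
import numpy as np

def dls_ik_step(jacobian, pose_error, damping=0.05):
    """One Damped Least Squares update: dq = J^T (J J^T + lambda^2 I)^-1 e.

    The damping term keeps the solve well-conditioned near singularities,
    at the cost of slightly slower convergence.
    """
    J = np.asarray(jacobian, dtype=float)
    e = np.asarray(pose_error, dtype=float)
    m = J.shape[0]
    return J.T @ np.linalg.solve(J @ J.T + (damping ** 2) * np.eye(m), e)

def solve_finger_ik(q0, forward_kinematics, jacobian_fn, target,
                    iters=300, tol=1e-4):
    """Iterate DLS updates until the fingertip position reaches the target."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        e = target - forward_kinematics(q)
        if np.linalg.norm(e) < tol:
            break
        q += dls_ik_step(jacobian_fn(q), e)
    return q

# Hypothetical two-link planar "finger" (link lengths in arbitrary units).
L1, L2 = 1.0, 0.8

def fk(q):
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jac(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])
```

Because each finger is solved independently in the described pipeline, a loop like this would simply run once per finger with that finger's target contact point.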
PLEXUS Hand: Lightweight Four-Motor Prosthetic Hand Enabling Precision-Lateral Dexterous Manipulation
Kuroda, Yuki, Takahashi, Tomoya, Beltran-Hernandez, Cristian C, Hamaya, Masashi, Tanaka, Kazutoshi
Electric prosthetic hands should be lightweight to decrease the burden on the user, shaped like human hands for cosmetic purposes, and have motors inside to protect them from damage and dirt. In addition to the ability to perform daily activities, these features are essential for everyday use of the hand. In-hand manipulation is necessary for daily activities such as transitioning between different postures, particularly through rotational movements such as reorienting cards before slot insertion and operating tools such as screwdrivers. However, currently used electric prosthetic hands only achieve static grasp postures, and existing manipulation approaches require either many motors, which make the prosthesis too heavy for daily use, or complex mechanisms that demand a large internal space and force external motor placement, complicating attachment and exposing the components to damage. Alternatively, we combine a single-axis thumb and optimized thumb positioning to achieve basic postures and in-hand manipulation, that is, reorientation between precision and lateral grasps, using only four motors in a lightweight (311 g) prosthetic hand. Experimental validation using primitive objects of various widths (5-30 mm) and shapes (cylinders and prisms) resulted in success rates of 90-100% for reorientation tasks. The hand performed seal stamping and USB device insertion, as well as rotation to operate a screwdriver.
HannesImitation: Grasping with the Hannes Prosthetic Hand via Imitation Learning
Alessi, Carlo, Vasile, Federico, Ceola, Federico, Pasquale, Giulia, Boccardo, Nicolò, Natale, Lorenzo
Recent advancements in control of prosthetic hands have focused on increasing autonomy through the use of cameras and other sensory inputs. These systems aim to reduce the cognitive load on the user by automatically controlling certain degrees of freedom. In robotics, imitation learning has emerged as a promising approach for learning grasping and complex manipulation tasks while simplifying data collection. Its application to the control of prosthetic hands remains, however, largely unexplored. Bridging this gap could enhance dexterity restoration and enable prosthetic devices to operate in more unconstrained scenarios, where tasks are learned from demonstrations rather than relying on manually annotated sequences. To this end, we present HannesImitationPolicy, an imitation learning-based method to control the Hannes prosthetic hand, enabling object grasping in unstructured environments. Moreover, we introduce the HannesImitationDataset comprising grasping demonstrations in table, shelf, and human-to-prosthesis handover scenarios. We leverage such data to train a single diffusion policy and deploy it on the prosthetic hand to predict the wrist orientation and hand closure for grasping. Experimental evaluation demonstrates successful grasps across diverse objects and conditions. Finally, we show that the policy outperforms a segmentation-based visual servo controller in unstructured scenarios. Additional material is provided on our project page: https://hsp-iit.github.io/HannesImitation
- Europe > Italy > Liguria > Genoa (0.04)
- North America > United States (0.04)
- Europe > United Kingdom > England > Buckinghamshire > Milton Keynes (0.04)
- Europe > Finland > Uusimaa > Helsinki (0.04)
Towards Biosignals-Free Autonomous Prosthetic Hand Control via Imitation Learning
Shi, Kaijie, Lu, Wanglong, Zhao, Hanli, da Fonseca, Vinicius Prado, Zou, Ting, Jiang, Xianta
Limb loss affects millions globally, impairing physical function and reducing quality of life. Most traditional surface electromyographic (sEMG) and semi-autonomous methods require users to generate myoelectric signals for each control action, imposing physically and mentally taxing demands. This study aims to develop a fully autonomous control system that enables a prosthetic hand to automatically grasp and release objects of various shapes using only a camera attached to the wrist. By placing the hand near an object, the system automatically executes grasping actions with a proper grip force in response to the hand's movements and the environment. To release a grasped object, the user simply places it near the table and the system automatically opens the hand. Such a system would provide individuals with limb loss a very easy-to-use prosthetic control interface and greatly reduce mental effort during use. To achieve this goal, we developed a teleoperation system to collect human demonstration data for training the prosthetic hand control model using imitation learning, which mimics prosthetic hand actions demonstrated by humans. By training the model on data from only a few objects and a single participant, we have shown that the imitation learning algorithm can achieve high success rates, generalizing to more individuals and unseen objects. This work has been submitted to the IEEE for possible publication. This work was supported in part by the Government of Canada's New Frontiers in Research Fund (NFRF, Grant No NFRFE-2022-00407) and the Natural Sciences and Engineering Research Council of Canada's Research Tools and Instruments (NSERC RTI, Grant No RTI-2022-00688). This work involved human subjects or animals in its research. Approval of all ethical and experimental procedures and protocols was granted by the Memorial University Interdisciplinary Committee on Ethics in Human Research (20210316-SC).
Kaijie Shi, Wanglong Lu are with Department of Computer Science, Memorial University of Newfoundland, St. John's, NL A1B 3X5, Canada, and also with College of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou, 325000, China. Hanli Zhao is with College of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou, 325000, China. Vinicius Prado da Fonseca is with Department of Computer Science, Memorial University of Newfoundland, St. John's, NL A1B 3X5, Canada. Ting Zou is with Department of Mechanical and Mechatronics Engineering, Memorial University of Newfoundland, St. John's, NL A1B 3X5, Canada.
- North America > Canada > Newfoundland and Labrador > Newfoundland > St. John's (0.64)
- Asia > China (0.44)
- Research Report > Experimental Study (1.00)
- Research Report > New Finding (0.93)
- Health & Medicine > Health Care Technology (1.00)
- Government > Regional Government > North America Government (0.34)
World's first touch-sensing bionic hand with lightning-fast response
Tech expert Kurt Knutsson says the Ability Hand brings real touch, natural movement and unmatched durability. Losing a hand or limb is a life-changing event, and finding a prosthetic that can truly feel has long been a challenge. For many, traditional prosthetics offer limited movement and no sense of touch, making everyday tasks difficult and frustrating. But what if a prosthetic hand could do more than just move? What if it could actually feel the objects you touch, giving you real-time feedback and control?
A Vision-Enabled Prosthetic Hand for Children with Upper Limb Disabilities
Sarker, Md Abdul Baset, Nguyen, Art, Kukla, Sigmond, Fite, Kevin, Imtiaz, Masudul H.
Md Abdul Baset Sarker and Art Nguyen are with Clarkson University, Potsdam, NY 13699, USA (e-mail: sarkerm@clarkson.edu; nguyenqp@clarkson.edu). Paper submission date: Mar 31, 2025. This work was supported in part by Clarkson University, Potsdam, NY.
ABSTRACT: This paper introduces a novel AI vision-enabled pediatric prosthetic hand designed to assist children aged 10-12 with upper limb disabilities. The prosthesis features an anthropomorphic appearance, multi-articulating functionality, and a lightweight design that mimics a natural hand, making it both accessible and affordable for low-income families. Using 3D printing technology and integrating advanced machine vision, sensing, and embedded computing, the prosthetic hand offers a low-cost, customizable solution that addresses the limitations of current myoelectric prostheses. A micro camera is interfaced with a low-power FPGA for real-time object detection and assists with precise grasping. The onboard DL-based object detection and grasp classification models achieved accuracies of 96% and 100%, respectively. In force prediction, the mean absolute error was 0.018. The features of the proposed prosthetic hand can thus be summarized as: a) a wrist-mounted micro camera for artificial sensing, enabling a wide range of hand-based tasks; b) real-time object detection and distance estimation for precise grasping; and c) ultra-low-power operation that delivers high performance within constrained power and resource limits.
INDEX TERMS: artificial intelligence, prosthetic hand, rehabilitation, vision
INTRODUCTION
Congenital limb loss and upper extremity abnormalities are estimated to occur in approximately 15 individuals per 100,000 live births in the United States alone [1], [2]. Beyond congenital disabilities, tumors, severe infections, and traumatic injuries also cause pediatric limb deficiency and place a significant physical and emotional burden on a child and their family. Replacement of an upper limb with a functional prosthetic hand has the potential to restore some limb functionality and improve the independence of these children. Furthermore, the earlier children are fitted for a powered prosthesis, the lower the rate of prosthetic hand rejection in later years of their life [3].
- Europe > Germany > Brandenburg > Potsdam (0.65)
- North America > United States > Alabama (0.04)
- North America > United States > New York > Albany County > Albany (0.04)
- Asia > Bangladesh > Dhaka Division > Dhaka District > Dhaka (0.04)
Smart Ankleband for Plug-and-Play Hand-Prosthetic Control
Zadok, Dean, Salzman, Oren, Wolf, Alon, Bronstein, Alex M.
Building robotic prostheses requires the creation of a sensor-based interface designed to provide the robotic hand with the control required to perform hand gestures. Traditional Electromyography (EMG) based prosthetics and emerging alternatives often face limitations such as muscle-activation limitations, high cost, and complex-calibration procedures. In this paper, we present a low-cost robotic system composed of a smart ankleband for intuitive, calibration-free control of a robotic hand, and a robotic prosthetic hand that executes actions corresponding to leg gestures. The ankleband integrates an Inertial Measurement Unit (IMU) sensor with a lightweight temporal neural network to infer user-intended leg gestures from motion data. Our system represents a significant step towards higher adoption rates of robotic prostheses among arm amputees, as it enables one to operate a prosthetic hand using a low-cost, low-power, and calibration-free solution. To evaluate our work, we collected data from 10 subjects and tested our prototype ankleband with a robotic hand on an individual with upper-limb amputations. Our results demonstrate that this system empowers users to perform daily tasks more efficiently, requiring few compensatory movements.
- Health & Medicine > Therapeutic Area > Neurology (1.00)
- Health & Medicine > Health Care Technology (1.00)
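The ankleband abstract describes inferring leg gestures from IMU motion data with a lightweight temporal neural network, but gives no architecture details. As a hedged illustration only, with all names, window sizes, and kernel shapes hypothetical, a windowed IMU stream feeding a tiny one-layer temporal convolution classifier could be sketched as:

```python
import numpy as np

def sliding_windows(imu, win=50, hop=25):
    """Split a (T, 6) accel+gyro stream into overlapping (win, 6) windows."""
    imu = np.asarray(imu, dtype=float)
    starts = range(0, len(imu) - win + 1, hop)
    return np.stack([imu[s:s + win] for s in starts])

def temporal_conv_logits(window, kernels, biases):
    """One 1-D convolution over time + ReLU + global average pooling.

    kernels: (n_classes, k, channels), biases: (n_classes,).
    Returns one logit per gesture class.
    """
    k = kernels.shape[1]
    T = window.shape[0]
    logits = []
    for w, b in zip(kernels, biases):
        # valid 1-D convolution along the time axis, summed across channels
        resp = np.array([np.sum(window[t:t + k] * w) + b
                         for t in range(T - k + 1)])
        logits.append(np.maximum(resp, 0.0).mean())  # ReLU + average pool
    return np.array(logits)

def classify_gesture(window, kernels, biases):
    """Predict the gesture class index for one IMU window."""
    return int(np.argmax(temporal_conv_logits(window, kernels, biases)))
```

In a trained system the kernels and biases would come from supervised training on labeled gesture recordings; this sketch only shows the shape flow from raw 6-axis IMU samples to a per-window class decision.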
Soft Everting Prosthetic Hand and Comparison with Existing Body-Powered Terminal Devices
Park, Gayoung, Schäffer, Katalin, Coad, Margaret M.
In this paper, we explore the use of a soft gripper, specifically a soft inverting-everting toroidal hydrostat, as a prosthetic hand. We present a design of the gripper integrated into a body-powered elbow-driven system and evaluate its performance compared to similar body-powered terminal devices: the Kwawu 3D-printed hand and the Hosmer hook. Our experiments highlight advantages of the Everting hand, such as low required cable tension for operation (1.6 N for Everting, 30.0 N for Kwawu, 28.1 N for Hosmer), limited restriction on the elbow angle range, and secure grasping capability (peak pulling force required to remove an object: 15.8 N for Everting, 6.9 N for Kwawu, 4.0 N for Hosmer). In our pilot user study, six able-bodied participants performed standardized hand dexterity tests. With the Everting hand compared to the Kwawu hand, users transferred more blocks in one minute and completed three tasks (moving small common objects, simulated feeding with a spoon, and moving large empty cans) faster (p < 0.05). With the Everting hand compared to the Hosmer hook, users moved large empty cans faster (p < 0.05) and achieved similar performance on all other tasks. Overall, user preference leaned toward the Everting hand for its adaptable grip and ease of use, although its abilities could be improved in tasks requiring high precision, such as writing with a pen, and in handling heavier objects, such as large heavy cans. For individuals with limb difference that affects their hand function, prosthetic hands have the potential to restore their ability to achieve everyday tasks [1], [2].
- North America > United States > South Carolina (0.04)
- North America > United States > Minnesota (0.04)
- North America > United States > Indiana > St. Joseph County > Notre Dame (0.04)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
Soft robotic prosthetic hand uses nerve signals for more natural control
The approach combines the natural coordination patterns of our fingers with the decoding of motoneuron activity in the spinal column. Recent advancements in technology have revolutionized the world of assistive and medical tools, and prosthetic limbs are no exception. We've come a long way from the rigid, purely cosmetic prosthetics of the past. Today, we're seeing the rise of softer, more realistic designs, many incorporating robotic components that significantly expand their functionality. Despite these exciting developments, a major challenge remains: How do we make these robotic limbs easier and more intuitive for users to control?
Continuous Wrist Control on the Hannes Prosthesis: a Vision-based Shared Autonomy Framework
Vasile, Federico, Maiettini, Elisa, Pasquale, Giulia, Boccardo, Nicolò, Natale, Lorenzo
Most control techniques for prosthetic grasping focus on dexterous finger control but overlook wrist motion. This forces the user to perform compensatory movements with the elbow, shoulder, and hip to adapt the wrist for grasping. We propose a computer vision-based system that leverages collaboration between the user and an automatic system in a shared autonomy framework to perform continuous control of the wrist degrees of freedom in a prosthetic arm, promoting a more natural approach-to-grasp motion. Our pipeline seamlessly controls the prosthetic wrist to follow the target object and finally orients it for grasping according to the user's intent. We assess the effectiveness of each system component through quantitative analysis and finally deploy our method on the Hannes prosthetic arm. Code and videos: https://hsp-iit.github.io/hannes-wrist-control.
- Europe > Italy > Liguria > Genoa (0.05)
- Europe > United Kingdom > England > Buckinghamshire > Milton Keynes (0.04)